Sharp thresholds for high-dimensional and noisy recovery of sparsity using l1-constrained quadratic programming

Author

  • Martin J. Wainwright

Abstract

The problem of consistently estimating the sparsity pattern of a vector β∗ ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is to establish precise conditions on the problem dimension p, the number k of non-zero elements in β∗, and the number of observations n that are necessary and sufficient for subset selection using the Lasso. For a broad class of Gaussian ensembles satisfying mutual incoherence conditions, we establish the existence of, and compute explicit values for, thresholds 0 < θl ≤ 1 ≤ θu < +∞ with the following properties: for any δ > 0, if n > 2(θu + δ) k log(p − k), then the Lasso succeeds in recovering the sparsity pattern with probability converging to one for large problems, whereas if n < 2(θl − δ) k log(p − k), the probability of successful recovery converges to zero. For the special case of the uniform Gaussian ensemble, we show that θl = θu = 1, so that the precise threshold n = 2 k log(p − k) is exactly determined.
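
As a rough illustration of the threshold behavior described in the abstract, the sketch below simulates support recovery under the uniform Gaussian ensemble at sample sizes n = 2θ k log(p − k) for θ below, at, and above one. Everything here is an assumption for illustration: scikit-learn's coordinate-descent Lasso stands in for a generic l1-constrained QP solver, and the regularization level follows a σ√(log p / n) scaling only up to an arbitrary constant; it is not the paper's experimental setup.

```python
# Illustrative simulation of the n = 2*k*log(p - k) sample-size threshold
# (a sketch under assumed constants, not the paper's experiments).
import numpy as np
from sklearn.linear_model import Lasso

def support_recovered(p=512, k=8, theta=1.0, sigma=0.5, seed=0):
    """Draw one instance from the uniform Gaussian ensemble and check
    whether the Lasso recovers the support of beta*."""
    rng = np.random.default_rng(seed)
    n = int(np.ceil(2 * theta * k * np.log(p - k)))   # n = 2*theta*k*log(p - k)
    X = rng.standard_normal((n, p))                    # i.i.d. standard Gaussian design
    beta = np.zeros(p)
    support = rng.choice(p, size=k, replace=False)
    beta[support] = rng.choice([-1.0, 1.0], size=k)    # +-1 signal on the support
    y = X @ beta + sigma * rng.standard_normal(n)
    # Regularization on the scale sigma*sqrt(log p / n); the constant is illustrative.
    lam = 2.0 * sigma * np.sqrt(2.0 * np.log(p) / n)
    fit = Lasso(alpha=lam, fit_intercept=False, max_iter=50000).fit(X, y)
    est_support = set(np.flatnonzero(np.abs(fit.coef_) > 1e-3))
    return est_support == set(support)

if __name__ == "__main__":
    for theta in (0.5, 1.0, 1.5):   # below, at, and above the predicted threshold
        rate = np.mean([support_recovered(theta=theta, seed=s) for s in range(20)])
        print(f"theta = {theta:.1f}: empirical success rate = {rate:.2f}")
```

For large p one would expect the empirical success rate to move from near zero to near one as θ crosses the θl = θu = 1 threshold; at the modest problem sizes used in this sketch the transition is visible but not sharp.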

Similar articles

Sharp thresholds for high-dimensional and noisy sparsity recovery using l1-constrained quadratic programming (Lasso)

The problem of consistently estimating the sparsity pattern of a vector β ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is...

Sharp thresholds for high-dimensional and noisy recovery of sparsity

The problem of consistently estimating the sparsity pattern of a vector β∗ ∈ R^p based on observations contaminated by noise arises in various contexts, including subset selection in regression, structure estimation in graphical models, sparse approximation, and signal denoising. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering th...

Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using l1-Constrained Quadratic Programming (Lasso)

The problem of consistently estimating the sparsity pattern of a vector β∗ based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is to esta...

A Sharp Sufficient Condition for Sparsity Pattern Recovery

The sufficient number of linear and noisy measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...

Breaking through the thresholds: an analysis for iterative reweighted l1 minimization via the Grassmann angle framework

It is now well understood that the l1 minimization algorithm is able to recover sparse signals from incomplete measurements [2], [1], [3] and sharp recoverable sparsity thresholds have also been obtained for the l1 minimization algorithm. However, even though iterative reweighted l1 minimization algorithms or related algorithms have been empirically observed to boost the recoverable sparsity th...
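
As a companion to the preview above, here is a minimal sketch of the iterative reweighted l1 idea: repeatedly solve a weighted l1-regularized least-squares problem, shrinking the weight on coordinates whose current estimates are large. The weight update w_i = 1/(|β_i| + ε), the number of iterations, the solver (a weighted Lasso implemented by rescaling columns), and all constants are assumptions for illustration; they are not the specific scheme or the Grassmann-angle analysis of the cited paper.

```python
# Sketch of iterative reweighted l1 minimization via a weighted Lasso.
import numpy as np
from sklearn.linear_model import Lasso

def reweighted_l1(X, y, lam=0.01, n_iter=4, eps=0.1):
    """Iteratively solve a weighted l1-regularized least-squares problem,
    downweighting coordinates with large current estimates."""
    n, p = X.shape
    w = np.ones(p)
    beta = np.zeros(p)
    for _ in range(n_iter):
        # The weighted penalty sum_i w_i*|b_i| is handled by rescaling columns:
        # solve a standard Lasso in the variables z_i = w_i * b_i.
        Xw = X / w                      # column i scaled by 1 / w_i
        z = Lasso(alpha=lam, fit_intercept=False, max_iter=50000).fit(Xw, y).coef_
        beta = z / w
        w = 1.0 / (np.abs(beta) + eps)  # reweight: smaller weight on large coefficients
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, k = 60, 200, 10
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
    y = X @ beta_true + 0.01 * rng.standard_normal(n)
    beta_hat = reweighted_l1(X, y)
    print("recovered support size:", int(np.sum(np.abs(beta_hat) > 1e-3)),
          "true support size:", k)
```

The column-rescaling step works because minimizing (1/2n)||y − Xβ||² + λ Σ w_i|β_i| over β is equivalent to a standard Lasso in z_i = w_i β_i with the i-th column of X divided by w_i.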

Publication date: 2007